21. Cross-Lingual Transfer Learning for Arabic Task-Oriented Dialogue Systems Using Multilingual Transformer Model mT5
    In: Mathematics; Volume 10; Issue 5; Pages: 746 (2022)
    Source: BASE
22. Measuring Terminology Consistency in Translated Corpora: Implementation of the Herfindahl-Hirshman Index
    In: Information; Volume 13; Issue 2; Pages: 43 (2022)
    Source: BASE
23. Comparative Study of Multiclass Text Classification in Research Proposals Using Pretrained Language Models
    In: Applied Sciences; Volume 12; Issue 9; Pages: 4522 (2022)
    Source: BASE
24. The Role of Task Complexity and Dominant Articulatory Routines in the Acquisition of L3 Spanish
    In: Languages; Volume 7; Issue 2; Pages: 90 (2022)
    Source: BASE
25. Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation
    In: Information; Volume 13; Issue 5; Pages: 220 (2022)
    Abstract: We consider neural sign language translation: machine translation from signed to written languages using encoder–decoder neural networks. Translating sign language videos to written language text is especially complex because of the difference in modality between source and target language and, consequently, the required video processing. At the same time, sign languages are low-resource languages, their datasets dwarfed by those available for written languages. Recent advances in written language processing and success stories of transfer learning raise the question of how pretrained written language models can be leveraged to improve sign language translation. We apply the Frozen Pretrained Transformer (FPT) technique to initialize the encoder, decoder, or both, of a sign language translation model with parts of a pretrained written language model. We observe that the attention patterns transfer zero-shot to the different modality and, in some experiments, we obtain higher scores (from 18.85 to 21.39 BLEU-4). Especially when gloss annotations are unavailable, FPTs can increase performance on unseen data. However, current models appear to be limited primarily by data quality and only then by data quantity, limiting potential gains with FPTs. Therefore, in further research, we will focus on improving the representations used as inputs to translation models.
    Keywords: machine translation; sign language translation; transfer learning
    URL: https://doi.org/10.3390/info13050220
    Source: BASE
26. Analyzing COVID-19 Medical Papers Using Artificial Intelligence: Insights for Researchers and Medical Professionals
    In: Big Data and Cognitive Computing; Volume 6; Issue 1; Pages: 4 (2022)
    Source: BASE
27. The Effects of Event Depictions in Second Language Phrasal Vocabulary Learning
    Source: BASE
30. Ethnocultural and Sociolinguistic Factors in Teaching Russian as a Foreign Language ...
    Source: BASE
34. Toward an Epistemic Web
    In: 197; RatSWD Working Paper Series; 22 (2022)
    Source: BASE
35. StaResGRU-CNN with CMedLMs: A Stacked Residual GRU-CNN with Pre-Trained Biomedical Language Models for Predictive Intelligence
    Source: BASE
37. An Empirical Study of Factors Affecting Language-Independent Models
    Source: BASE
38. „A Hund is er scho'". The Migration of an Expression and Its Bavarian-Hungarian Transfer History
    Source: BASE
39. Neural-based Knowledge Transfer in Natural Language Processing
    Source: BASE
40. Chinese Idioms: Stepping Into L2 Student's Shoes
    In: Acta Linguistica Asiatica; Volume 12; Issue 1 (2022)
    Source: BASE